Moderately clipped LASSO for the high-dimensional generalized linear model


Related articles

High-Dimensional Generalized Linear Models and the Lasso

We consider high-dimensional generalized linear models with Lipschitz loss functions, and prove a nonasymptotic oracle inequality for the empirical risk minimizer with Lasso penalty. The penalty is based on the coefficients in the linear predictor, after normalization with the empirical norm. The examples include logistic regression, density estimation and classification with hinge loss. Least ...
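As a concrete, simplified illustration of the setting above, an L1-penalized (Lasso) logistic regression with more features than observations can be fit as follows. This is a generic scikit-learn sketch, not the estimator analyzed in the paper; the problem sizes and penalty strength are illustrative choices:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic sparse logistic-regression problem: only the first 5 of 200
# coefficients are nonzero, with fewer observations than features.
rng = np.random.default_rng(0)
n, p, s = 100, 200, 5
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:s] = 2.0
y = (X @ beta + rng.standard_normal(n) > 0).astype(int)

# L1-penalized empirical risk minimization (the Lasso penalty on the
# coefficients of the linear predictor); C is the inverse penalty strength.
model = LogisticRegression(penalty="l1", solver="liblinear", C=0.5)
model.fit(X, y)

# The L1 penalty zeroes out most of the p coefficients.
support = np.flatnonzero(model.coef_.ravel())
print("nonzero coefficients:", support.size, "out of", p)
```

With p > n, the L1 penalty produces a sparse coefficient vector, which is what the oracle inequality above controls.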


Group Lasso for generalized linear models in high dimension

We present a Group Lasso procedure for generalized linear models (GLMs) and study the properties of this estimator applied to sparse high-dimensional GLMs. Under general conditions on the joint distribution of the observation-covariate pair, we provide oracle inequalities promoting group sparsity of the covariates. We obtain convergence rates for the prediction and estimation error and we show...
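For intuition, the Group Lasso penalty can be minimized by proximal gradient descent, whose proximal step is block-wise soft-thresholding. The following is a minimal sketch for the squared-error case only, not the general GLM setting studied in the paper; the toy data and penalty level are illustrative:

```python
import numpy as np

def group_soft_threshold(v, t):
    """Proximal operator of t * ||.||_2: shrink the whole block toward zero."""
    norm = np.linalg.norm(v)
    return np.zeros_like(v) if norm <= t else (1.0 - t / norm) * v

def group_lasso_ls(X, y, groups, lam, iters=500):
    """Proximal gradient for (1/2n)||y - Xb||^2 + lam * sum_g ||b_g||_2.
    `groups` is a list of index arrays partitioning the columns of X."""
    n, p = X.shape
    step = n / np.linalg.norm(X, 2) ** 2  # 1 / Lipschitz constant of the gradient
    beta = np.zeros(p)
    for _ in range(iters):
        grad = X.T @ (X @ beta - y) / n
        z = beta - step * grad
        for g in groups:
            beta[g] = group_soft_threshold(z[g], step * lam)
    return beta

# Toy problem: only the first of four groups of three covariates is active.
rng = np.random.default_rng(1)
X = rng.standard_normal((50, 12))
beta_true = np.zeros(12)
beta_true[:3] = 1.0
y = X @ beta_true
groups = [np.arange(g, g + 3) for g in range(0, 12, 3)]
beta_hat = group_lasso_ls(X, y, groups, lam=0.1)
```

Because the threshold is applied to the norm of each block, entire groups of coefficients are set to zero together, which is the group-sparsity promoted by the oracle inequalities above.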


Generalized orthogonal components regression for high dimensional generalized linear models

Here we propose an algorithm, named generalized orthogonal components regression (GOCRE), to explore the relationship between a categorical outcome and a massive set of variables. A set of orthogonal components is sequentially constructed to account for the variation of the categorical outcome, and together these components build up a generalized linear model (GLM). This algorithm can be considered as an exten...


Non-asymptotic Oracle Inequalities for the Lasso and Group Lasso in high dimensional logistic model

We consider the problem of estimating a function f0 in the logistic regression model. We propose to estimate this function f0 by a sparse approximation built as a linear combination of elements of a given dictionary of p functions. This sparse approximation is selected by the Lasso or Group Lasso procedure. In this context, we state non-asymptotic oracle inequalities for the Lasso and Group Lasso under...


High-Dimensional Feature Selection by Feature-Wise Non-Linear Lasso

The goal of supervised feature selection is to find a subset of input features that are responsible for predicting output values. The least absolute shrinkage and selection operator (Lasso) allows computationally efficient feature selection based on linear dependency between input features and output values. In this paper, we consider a feature-wise kernelized Lasso for capturing non-linear inp...
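A feature-wise kernelized Lasso in this spirit (HSIC Lasso) can be sketched as a non-negative Lasso regression of the vectorized, centered Gram matrix of the output on the vectorized, centered Gram matrices of the individual features. The kernel widths and penalty level below are illustrative assumptions, not the paper's settings:

```python
import numpy as np
from sklearn.linear_model import Lasso

def centered_gram(v, sigma):
    """Centered Gaussian Gram matrix of a single feature (or the output)."""
    K = np.exp(-(v[:, None] - v[None, :]) ** 2 / (2 * sigma ** 2))
    H = np.eye(len(v)) - np.ones((len(v), len(v))) / len(v)
    return H @ K @ H

def feature_wise_kernel_lasso(X, y, frac=0.1):
    """Non-negative Lasso over vectorized per-feature Gram matrices."""
    n, p = X.shape
    l = centered_gram(y, np.std(y)).ravel()
    l /= np.linalg.norm(l)
    Phi = np.column_stack(
        [centered_gram(X[:, k], 1.0).ravel() for k in range(p)]
    )
    Phi /= np.linalg.norm(Phi, axis=0)
    # Illustrative penalty, set relative to the largest kernel alignment
    # between a feature and the output.
    alpha_max = np.abs(Phi.T @ l).max() / len(l)
    model = Lasso(alpha=frac * alpha_max, positive=True, fit_intercept=False)
    model.fit(Phi, l)
    return model.coef_

# Feature 0 drives y through a nonlinear (quadratic) map; the rest are noise.
rng = np.random.default_rng(2)
X = rng.standard_normal((60, 5))
y = X[:, 0] ** 2
weights = feature_wise_kernel_lasso(X, y)
```

Because the dependency is measured through kernels on each feature separately, a nonlinear relationship such as y = x0^2, invisible to the plain linear Lasso, still yields a positive weight for that feature.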



Journal

Journal title: Communications for Statistical Applications and Methods

Year: 2020

ISSN: 2383-4757

DOI: 10.29220/csam.2020.27.4.445